Minimizing GCV/GML Scores with Multiple Smoothing Parameters via the Newton Method
Authors
Abstract
The (modified) Newton method is adapted to optimize generalized cross validation (GCV) and generalized maximum likelihood (GML) scores with multiple smoothing parameters. The main concerns in solving the optimization problem are the speed and the reliability of the algorithm, as well as the invariance of the algorithm under transformations under which the problem itself is invariant. The proposed algorithm is believed to be highly efficient for the problem, though it is still rather expensive for large data sets, since its operation count is (2/3)kn³ + O(n²), with k the number of smoothing parameters and n the number of observations. Sensible procedures for computing good starting values are also proposed, which should help keep the computational load to the minimum possible. The algorithm is implemented in RKPACK [RKPACK and its applications: Fitting smoothing spline models, Tech. Report 857, Department of Statistics, University of Wisconsin, Madison, WI, 1989] and illustrated by examples of fitting additive and interaction spline models. It is noted that the algorithm can also be applied to maximum likelihood (ML) and restricted maximum likelihood (REML) estimation in variance component models.

Key words. additive/interaction spline models, gradient, Hessian, invariance, Newton method, smoothing parameters, starting values

AMS(MOS) subject classifications. 65D07, 65D10, 65D15, 65K10, 65V05
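To make the optimization problem concrete, the sketch below illustrates a modified Newton iteration on log-smoothing parameters for a generic GCV score V(λ) = n‖(I − A(λ))y‖² / [tr(I − A(λ))]², where A(λ) = (I + Σⱼ λⱼQⱼ)⁻¹ is the influence matrix of a ridge-type smoother. This is a hypothetical illustration only, not the RKPACK algorithm: the penalty matrices, the finite-difference gradient/Hessian, and the step-halving safeguard are assumptions made for brevity, whereas the paper works with analytic derivatives and a much more economical factorization of the problem.

```python
# Hypothetical sketch only -- NOT the RKPACK implementation described in the paper.
# Modified Newton iteration on log-smoothing parameters for a generic GCV score,
# using numerical derivatives for brevity.
import numpy as np

def gcv_score(log_lams, y, penalties):
    """GCV score n*||(I - A)y||^2 / tr(I - A)^2 with A = (I + sum_j lam_j Q_j)^(-1)."""
    n = y.size
    lams = np.exp(log_lams)                        # log scale keeps the iteration scale-invariant
    M = np.eye(n) + sum(l * Q for l, Q in zip(lams, penalties))
    A = np.linalg.inv(M)                           # influence ("hat") matrix of the smoother
    r = y - A @ y
    return n * (r @ r) / np.trace(np.eye(n) - A) ** 2

def newton_gcv(y, penalties, theta0=None, h=1e-5, tol=1e-8, max_iter=50):
    """Minimize gcv_score over theta = log(lambda) with a modified Newton method."""
    k = len(penalties)
    theta = np.zeros(k) if theta0 is None else np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        f0 = gcv_score(theta, y, penalties)
        g = np.zeros(k)
        H = np.zeros((k, k))
        E = h * np.eye(k)
        for i in range(k):                         # central-difference gradient
            g[i] = (gcv_score(theta + E[i], y, penalties)
                    - gcv_score(theta - E[i], y, penalties)) / (2 * h)
        for i in range(k):                         # central-difference Hessian
            for j in range(i, k):
                H[i, j] = H[j, i] = (gcv_score(theta + E[i] + E[j], y, penalties)
                                     - gcv_score(theta + E[i] - E[j], y, penalties)
                                     - gcv_score(theta - E[i] + E[j], y, penalties)
                                     + gcv_score(theta - E[i] - E[j], y, penalties)) / (4 * h * h)
        w, V = np.linalg.eigh(H)                   # "modify" the Hessian: force positive definiteness
        w = np.maximum(w, 1e-6 * max(abs(w).max(), 1.0))
        step = -V @ ((V.T @ g) / w)
        t = 1.0                                    # step halving as a crude reliability safeguard
        while gcv_score(theta + t * step, y, penalties) > f0 and t > 1e-4:
            t *= 0.5
        theta = theta + t * step
        if np.linalg.norm(t * step) < tol:
            break
    return theta                                   # estimated log-smoothing parameters

# Toy usage with two illustrative (hypothetical) penalty matrices:
rng = np.random.default_rng(0)
n = 60
x = np.linspace(0.0, 1.0, n)
y = np.sin(4 * np.pi * x) + 0.3 * rng.standard_normal(n)
D2 = np.diff(np.eye(n), n=2, axis=0)               # second-difference (roughness) penalty
penalties = [D2.T @ D2, np.diag(x)]
log_lams = newton_gcv(y, penalties)
```

Parameterizing the search in θ = log λ mirrors the invariance concern in the abstract: rescaling a penalty only shifts the corresponding θ, so the iteration behaves the same. The finite-difference derivatives here cost one GCV evaluation per perturbation, which is exactly the expense the paper's analytic gradient and Hessian are designed to avoid.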
Similar articles
Spline Smoothing With Correlated Random Errors
Spline smoothing provides a powerful tool for estimating nonparametric functions. Most of the past work is based on the assumption that the random errors are independent. Observations are often correlated in applications; e.g., time series data, spatial data and clustered data. It is well known that correlation greatly affects the selection of smoothing parameters, which are critical to the perf...
Smoothing Spline Estimation of Variance Functions
This article considers spline smoothing of variance functions. We focus on selection of smoothing parameters and develop three direct data-driven methods: unbiased risk (UBR), generalized approximate cross validation (GACV) and generalized maximum likelihood (GML). In addition to guaranteed convergence, simulations show that these direct methods perform better than existing indirect UBR, genera...
Smoothing Spline Semi-parametric Nonlinear Regression Models
We consider the problem of modeling the mean function in regression. Often there is enough knowledge to model some components of the mean function parametrically. But for other vague and/or nuisance components, it is often desirable to leave them unspecified and model them nonparametrically. In this article, we propose a general class of smoothing spline semi-parametric nonlinear regression ...
Spline Smoothing for Bivariate Data with Applications to Association between Hormones
In this paper penalized weighted least-squares is used to jointly estimate nonparametric functions from contemporaneously correlated data. Under conditions generally encountered in practice, it is shown that these joint estimates have smaller posterior variances than those of marginal estimates and are therefore more efficient. We describe three methods: generalized maximum likelihood (GML), ge...
Fast stable REML and ML estimation of semiparametric GLMs
Recent work by Reiss and Ogden (2009) provides a theoretical basis for sometimes preferring restricted maximum likelihood (REML) to generalized cross validation (GCV) for smoothing parameter selection in semiparametric regression. However, existing REML or marginal likelihood (ML) based methods for semiparametric GLMs use iterative REML/ML estimation of the smoothing parameters of working linea...
Journal: SIAM J. Scientific Computing
Volume: 12, Issue: -
Pages: -
Publication date: 1991